# Korean text generation
## Lm3 8 Bnb 4bit V1.0
Korean text generation model based on the Llama 2 architecture, specialized for Korean natural language processing tasks.

- Tags: Large Language Model, Transformers, Korean
- Author: haes95
- Downloads: 18 · Likes: 1
## Mistral Ko 7B V0.1
A Korean-optimized version of the Mistral model, using a Korean-optimized tokenizer and fine-tuned on the Synatra dataset.

- Tags: Large Language Model, Transformers, Korean
- Author: maywell
- Downloads: 1,665 · Likes: 14
## Mini Synatra SFT
A Korean text generation model based on Mini_synatra_7b_02 with instruction fine-tuning, supporting the ChatML format for instruction-style interaction.

- Tags: Large Language Model, Transformers, Korean
- Author: maywell
- Downloads: 1,851 · Likes: 3
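The ChatML format mentioned above wraps each conversation turn in `<|im_start|>` / `<|im_end|>` special tokens. A minimal sketch of building such a prompt by hand (the helper function and example message are illustrative, not part of the model's documentation; in practice a model's own chat template should be preferred):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        # Each turn: <|im_start|>{role}\n{content}<|im_end|>\n
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "user", "content": "안녕하세요, 자기소개 해주세요."},
])
print(prompt)
```

The trailing open `<|im_start|>assistant\n` is what cues an instruction-tuned model to produce the assistant's response.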
## Koalpaca Polyglot 12.8B
A Korean text generation model based on EleutherAI/polyglot-ko-12.8b, fine-tuned on the KoAlpaca v1.1b dataset.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, Korean
- Author: beomi
- Downloads: 3,998 · Likes: 56
## Koalpaca Polyglot 5.8B
A Korean language model based on polyglot-ko-5.8b, fine-tuned on the KoAlpaca v1.1b dataset.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, Korean
- Author: beomi
- Downloads: 9,571 · Likes: 64
## Kogpt J 350m
A Korean text generation model based on the GPT-J architecture with 350 million parameters, suitable for various Korean text generation tasks.

- License: MIT
- Tags: Large Language Model, Korean
- Author: heegyu
- Downloads: 123 · Likes: 7
## T5 V1 1 Base Ko
A T5 1.1 model trained on a Korean corpus, using byte-level BPE (BBPE) and MeCab morphological analysis for optimized tokenization.

- License: Apache-2.0
- Tags: Large Language Model, Korean
- Author: team-lucid
- Downloads: 18 · Likes: 3
## Polyglot Ko 1.3b
Polyglot-Ko is a series of Korean autoregressive language models developed by EleutherAI's multilingual team; this member of the series has 1.3 billion parameters and is optimized specifically for Korean.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, Korean
- Author: EleutherAI
- Downloads: 121.13k · Likes: 83
## Kogpt2
KoGPT2 is a Korean generative pre-trained model built on the Hugging Face Transformers framework, developed and open-sourced by SKT-AI.

- Tags: Large Language Model, Transformers
- Author: taeminlee
- Downloads: 1,978 · Likes: 2